
    Fragment and Replicate Algorithms for Non-Equi-Join Evaluation on Smart Disks

    Abstract: The predicates in a non-equi-join can be anything but equality relations. A non-equi-join predicate can be as simple as an inequality expression between two join-relation fields, or as complex as a user-defined function that carries out arbitrarily complex comparisons. The nature of non-equi-joins calls for predicate evaluation over all possible combinations of tuples in a two-way join. In this paper, we consider the family of fragment-and-replicate join algorithms, which facilitates non-equi-join evaluation, and adapt it to a Smart Disk environment. We use Smart Disk as an umbrella term for a variety of storage devices featuring an embedded processor that can offload data processing from the main CPU. Our approach partially replicates one of the join relations in order to harness all processing capacity in the system. However, partial replication introduces problems with synchronizing concurrent algorithmic steps, load balancing, and selecting among different join evaluation alternatives. We use a processing model to avoid performance pitfalls and autonomously select algorithm parameters. Through experimentation we find that our proposed algorithms utilize all system resources and thus yield better performance. Index Terms: database join, non-equi-joins, smart disks, active disks, fragment-and-replicate parallelism, arrays of disks.
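    The core of the fragment-and-replicate scheme can be shown in a minimal, sequential Python sketch; this is an illustration under assumed names, not the paper's Smart Disk implementation (the per-fragment loop stands in for work that would run on the host CPU and the embedded disk processors, and the band-join predicate is an invented example):

        from itertools import product

        def fragment(relation, n_workers):
            # Split a relation into n_workers disjoint fragments (round-robin).
            fragments = [[] for _ in range(n_workers)]
            for i, tup in enumerate(relation):
                fragments[i % n_workers].append(tup)
            return fragments

        def fragment_and_replicate_join(r, s, predicate, n_workers=4):
            # Fragment r across workers and replicate s to every worker; each
            # worker tests the predicate on all combinations of its fragment
            # with s, since non-equi-joins cannot rely on hashing or sorting.
            results = []
            for frag in fragment(r, n_workers):        # one fragment per worker
                for r_tup, s_tup in product(frag, s):  # full cross product
                    if predicate(r_tup, s_tup):
                        results.append((r_tup, s_tup))
            return results

        # Example: a band join, a simple non-equi-join predicate.
        r = [(1, 10.0), (2, 17.5), (3, 42.0)]
        s = [(7, 12.0), (8, 40.0)]
        print(fragment_and_replicate_join(r, s, lambda a, b: abs(a[1] - b[1]) <= 5.0))
        # [((1, 10.0), (7, 12.0)), ((3, 42.0), (8, 40.0))]

    Because the predicate is an opaque function, the same structure accommodates inequality expressions and arbitrary user-defined comparisons alike.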

    Digital Object Prototypes: An Effective Realization of Digital Object Types

    Abstract: Digital Object Prototypes (DOPs) provide the DL designer with the ability to model diverse types of digital objects in a uniform manner while offering digital object type conformance: objects conform to the designer's type definitions automatically. In this paper, we outline how DOPs effectively capture and express digital object typing information and assist in the development of unified web-based DL services such as adaptive cataloguing, batch digital object ingestion, and automatic digital content conversion. In contrast, conventional DL services require custom implementations for each different type of material.
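    As a rough illustration of the idea, the following hypothetical Python classes model a prototype and a conformance check; the names and the notion of metadata-only conformance are assumptions made for the sketch, not the authors' data model:

        from dataclasses import dataclass, field

        @dataclass
        class Prototype:
            # A designer-supplied type definition for a class of digital objects.
            name: str
            required_metadata: set

        @dataclass
        class DigitalObject:
            prototype: Prototype
            metadata: dict = field(default_factory=dict)

            def conforms(self):
                # The object carries every field its prototype demands.
                return self.prototype.required_metadata <= self.metadata.keys()

        book = Prototype("book", {"title", "author", "isbn"})
        obj = DigitalObject(book, {"title": "A Title", "author": "An Author"})
        print(obj.conforms())  # False until "isbn" is supplied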

    Nefeli: Hint-Based Execution of Workloads in Clouds

    Abstract: Virtualization of computer systems has made feasible the provision of entire distributed infrastructures in the form of services. Such services do not expose the internal operational and physical characteristics of the underlying machinery to either users or applications. In this way, infrastructures including computers in data centers, clusters of workstations, and networks of machines are shrouded in “clouds”. Mainly through the deployment of virtual machines, such networks of computing nodes become cloud-computing environments. In this paper, we propose Nefeli, a virtual infrastructure gateway that is capable of effectively handling diverse workloads of jobs in cloud environments. By and large, users and their workloads remain agnostic to the internal features of clouds at all times. Exploiting execution patterns as well as logistical constraints, users provide Nefeli with hints for the handling of their jobs. Hints impose no hard requirements for application deployment in terms of pairing virtual machines to specific physical cloud elements. Nefeli helps avoid bottlenecks within the cloud through the realization of viable virtual-machine deployment mappings. As the types of jobs change over time, deployment mappings must follow suit. To this end, Nefeli offers mechanisms to migrate virtual machines as needed to adapt to changing performance needs. Using our prototype system, we show significant improvements in overall time needed and energy consumed for the execution of workloads in both simulated and real cloud-computing environments.
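    What hint-based placement could look like is sketched below; the hint vocabulary (colocate, separate) and the scoring function are hypothetical stand-ins rather than Nefeli's actual interface:

        # Hints are soft preferences, not hard VM-to-host pairings.
        workload_hints = [
            {"hint": "colocate", "vms": ("db", "app")},             # heavy mutual traffic
            {"hint": "separate", "vms": ("replica1", "replica2")},  # fault isolation
        ]

        def score_mapping(mapping, hints):
            # Score a candidate VM-to-host mapping; higher is better.
            score = 0
            for h in hints:
                a, b = h["vms"]
                same_host = mapping[a] == mapping[b]
                if h["hint"] == "colocate":
                    score += 1 if same_host else -1
                elif h["hint"] == "separate":
                    score += -1 if same_host else 1
            return score

        mapping = {"db": "host1", "app": "host1",
                   "replica1": "host1", "replica2": "host2"}
        print(score_mapping(mapping, workload_hints))  # 2: both hints satisfied

    Scoring candidate mappings instead of enforcing constraints mirrors the soft nature of hints: a mapping that violates one hint can still be chosen if it satisfies the rest.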

    An Analysis of Error in a Reuse-Oriented Development Environment

    Component reuse is widely considered vital for obtaining significant improvements in development productivity. However, as an organization adopts a reuse-oriented development process, the nature of the problems in development is likely to change. In this paper, we use a measurement-based approach to better understand and evaluate an evolving reuse process. More specifically, we study the effects of reuse across seven projects in a narrow domain from a single development organization. An analysis of the errors that occur in new and reused components across all phases of system development provides insight into the factors influencing the reuse process. We found significant differences between errors associated with new and various types of reused components in terms of the types of errors committed, when errors are introduced, and the effect that the errors have on the development process. (Also cross-referenced as UMIACS-TR-95-24.)

    06431 Working Group Summary: Atomicity in Mobile Networks

    We introduce different mobile network applications and show to what degree the concept of database transactions is required within the applications. We show properties of transaction processing and explain which properties are important for each of the mobile applications. Furthermore, we discuss open questions regarding transaction processing in mobile networks and identify open problems for further research.

    Outlier-Aware Data Aggregation in Sensor Networks

    Abstract: In this paper we discuss a robust aggregation framework that can detect spurious measurements and refrain from incorporating them in the computed aggregate values. Our framework can accommodate different definitions of an outlier node, based on a specified minimum support. Our experimental evaluation demonstrates the benefits of our approach.
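    A minimal sketch of a minimum-support check is given below; the fixed similarity tolerance, the neighbor lists, and all names are assumptions made for illustration:

        def find_outliers(readings, neighbors, tolerance, min_support):
            # Flag nodes whose readings too few neighbors corroborate.
            # A neighbor is a witness if its reading lies within tolerance.
            outliers = set()
            for node, value in readings.items():
                witnesses = sum(1 for n in neighbors[node]
                                if abs(readings[n] - value) <= tolerance)
                if witnesses < min_support:
                    outliers.add(node)
            return outliers

        readings = {"a": 21.0, "b": 21.4, "c": 20.8, "d": 55.0}  # "d" is spurious
        neighbors = {"a": ["b", "c", "d"], "b": ["a", "c", "d"],
                     "c": ["a", "b", "d"], "d": ["a", "b", "c"]}
        print(find_outliers(readings, neighbors, tolerance=1.0, min_support=2))  # {'d'}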

    Another Outlier Bites the Dust: Computing Meaningful Aggregates in Sensor Networks

    Abstract: Recent work has demonstrated that readings provided by commodity sensor nodes are often of poor quality. In order to provide a valuable sensory infrastructure for monitoring applications, we first need to devise techniques that can withstand “dirty” and unreliable data during query processing. In this paper we present a novel aggregation framework that detects suspicious measurements by outlier nodes and refrains from incorporating such measurements in the computed aggregate values. We consider different definitions of an outlier node, based on the notion of a user-specified minimum support, and discuss techniques for properly routing messages in the network in order to reduce bandwidth consumption and energy drain during query evaluation. In our experiments using real and synthetic traces we demonstrate that: (i) a straightforward evaluation of a user aggregate query leads to practically meaningless results due to the existence of outliers; (ii) our techniques can detect and eliminate spurious readings without any application-specific knowledge of what constitutes normal behavior; (iii) the identification of outliers, when performed inside the network, significantly reduces bandwidth and energy drain compared to alternative methods that centrally collect and analyze all sensory data; and (iv) we can significantly reduce the cost of the aggregation process by utilizing simple statistics on outlier nodes and reorganizing the collection tree accordingly.
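    Building on the outlier check sketched for the previous paper, the following hypothetical routine computes a partial aggregate up a collection tree while skipping flagged nodes; forwarding compact (sum, count) pairs instead of raw readings is what makes in-network evaluation cheaper than central collection (the tree layout and names are invented for the example):

        def aggregate(node, readings, children, outliers):
            # Compute a (sum, count) partial aggregate for the subtree rooted
            # at node, skipping readings flagged as outliers; each node would
            # forward this pair to its parent instead of raw readings.
            s, c = (0.0, 0) if node in outliers else (readings[node], 1)
            for child in children.get(node, []):
                cs, cc = aggregate(child, readings, children, outliers)
                s, c = s + cs, c + cc
            return s, c

        readings = {"root": 21.0, "n1": 21.4, "n2": 55.0, "n3": 20.8}
        children = {"root": ["n1", "n2"], "n2": ["n3"]}
        total, count = aggregate("root", readings, children, outliers={"n2"})
        print(total / count)  # mean over the non-outlier nodes: ~21.07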

    The Toronto prehospital hypertonic resuscitation-head injury and multi organ dysfunction trial (TOPHR HIT) - Methods and data collection tools

    Background: Clinical trials evaluating the use of hypertonic saline in the treatment of hypovolemia and head trauma suggest no survival superiority over normal saline; however, subgroup analyses suggest there may be a reduction in the inflammatory response and multiorgan failure, which may lead to better survival and enhanced neurocognitive function. We describe a feasibility study of randomizing head-injured patients to hypertonic saline and dextran (HSD) vs. normal saline administration in the out-of-hospital setting.

    Methods/Design: This feasibility study employs a randomized, placebo-controlled design evaluating normal saline compared with a single dose of 250 ml of 7.5% hypertonic saline in 6% dextran 70 in the management of traumatic brain injuries. The primary feasibility endpoints of the trial were: 1) baseline survival rates for the treatment and control groups, to aid in the design of a definitive multicentre trial; 2) randomization compliance rate; 3) ease of protocol implementation in the out-of-hospital setting; and 4) adverse event rate of HSD infusion. The secondary objectives include measuring the effect of HSD in modulating the immuno-inflammatory response to severe head injury and its effect on modulating the release of neuro-biomarkers into serum; evaluating the role of serum neuro-biomarkers in predicting patient outcome and clinical response to HSD intervention; and evaluating the effects of HSD on brain atrophy post-injury and on neurocognitive and neuropsychological outcomes.

    Discussion: We anticipate that three aspects of the trial will present challenges to trial success: the ethical demands associated with a waiver-of-consent trial; challenging follow-up; and the comprehensive, accurate, and timely collection of patient identifiers and clinical or laboratory values. In addition, all the data collection tools had to be derived de novo, as none existed in the literature.

    Trial registration number: NCT00878631.

    Infected pancreatic necrosis: outcomes and clinical predictors of mortality. A post hoc analysis of the MANCTRA-1 international study

    Abstract: The identification of high-risk patients in the early stages of infected pancreatic necrosis (IPN) is critical, because it could help clinicians adopt more effective management strategies. We conducted a post hoc analysis of the MANCTRA-1 international study to assess the association between clinical risk factors and mortality among adult patients with IPN. Univariable and multivariable logistic regression models were used to identify prognostic factors of mortality. We identified 247 consecutive patients with IPN hospitalised between January 2019 and December 2020. History of uncontrolled arterial hypertension (p = 0.032; 95% CI 1.135-15.882; aOR 4.245), qSOFA (p = 0.005; 95% CI 1.359-5.879; aOR 2.828), renal failure (p = 0.022; 95% CI 1.138-5.442; aOR 2.489), and haemodynamic failure (p = 0.018; 95% CI 1.184-5.978; aOR 2.661) were identified as independent predictors of mortality in IPN patients. Cholangitis (p = 0.003; 95% CI 1.598-9.930; aOR 3.983), abdominal compartment syndrome (p = 0.032; 95% CI 1.090-6.967; aOR 2.735), and gastrointestinal/intra-abdominal bleeding (p = 0.009; 95% CI 1.286-5.712; aOR 2.710) were independently associated with the risk of mortality. Upfront open surgical necrosectomy was strongly associated with the risk of mortality (p < 0.001; 95% CI 1.912-7.442; aOR 3.772), whereas endoscopic drainage of pancreatic necrosis (p = 0.018; 95% CI 0.138-0.834; aOR 0.339) and enteral nutrition (p = 0.003; 95% CI 0.143-0.716; aOR 0.320) were found to be protective factors. Organ failure, acute cholangitis, and upfront open surgical necrosectomy were the most significant predictors of mortality. Our study confirms that, even in a subgroup of particularly ill patients such as those with IPN, upfront open surgery should be avoided as much as possible. Study protocol registered on ClinicalTrials.gov (ID NCT04747990).
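    For readers unfamiliar with the reported statistics, the following sketch shows how adjusted odds ratios (aOR) with 95% CIs fall out of a multivariable logistic regression; it uses toy data and illustrative variable names and is not the MANCTRA-1 analysis code:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        df = pd.DataFrame({  # toy cohort standing in for the real data
            "mortality":         [0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0],
            "qsofa_positive":    [0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 1],
            "renal_failure":     [0, 1, 1, 0, 0, 0, 1, 1, 0, 1, 0, 0, 0, 1],
            "open_necrosectomy": [0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1],
        })

        X = sm.add_constant(df[["qsofa_positive", "renal_failure", "open_necrosectomy"]])
        fit = sm.Logit(df["mortality"], X).fit(disp=0)

        # Exponentiating coefficients turns log-odds into (adjusted) odds ratios.
        report = pd.concat([np.exp(fit.params).rename("aOR"),
                            np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"}),
                            fit.pvalues.rename("p")], axis=1)
        print(report)  # one row per predictor; aOR > 1 indicates higher odds of death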
